NEXT19 | James Bridle | New Dark Age: Is Technology Making the World Harder to Understand?

NEXT Conference
30 Sept 2019 | 28:56

Summary

TL;DR: The speaker, a technologist with a background in computer science, expresses concern over current technology trends, dubbing the present moment a 'New Dark Age.' They discuss the metaphorical 'cloud,' emphasizing its physical reality and societal impact. Drawing parallels between technology and weather, the speaker traces the historical connection between computation and weather prediction. They critique the opacity of modern tech, its role in climate change, and its use in societal manipulation, citing examples such as Amazon warehouses and YouTube's recommendation algorithm. The talk concludes with potential maneuvers for positive uses of technology, emphasizing the importance of understanding and ethical technology design.

Takeaways

  • 🌐 The term 'cloud' is a metaphor for vast data centers that power our digital interactions, yet its invisibility and mystery are part of its essence.
  • 💡 The historical connection between weather prediction and the development of early computers like ENIAC highlights the intertwined destinies of computation and our understanding of natural systems.
  • 📈 Moore's Law, which predicts the doubling of processing power every two years, has influenced our expectations of technological progress, yet it also masks the environmental and computational challenges we face.
  • 🔍 The increasing opacity of technology, from algorithmic biases to the hidden labor behind apps, is leading to a lack of public understanding and control, fostering confusion and societal issues.
  • 🌡️ Climate change is making weather patterns less predictable, challenging our reliance on historical data for future predictions and underscoring the limitations of technology in addressing global issues.
  • 🛍️ The convenience of modern technologies often obscures the human labor and environmental costs, as seen in 'dark kitchens' and Amazon's warehouse practices.
  • 🤖 Automation bias, where we overly trust technology, can lead to dangerous outcomes, as shown in studies where even trained professionals follow flawed automated advice.
  • 🌍 The Internet and associated technologies are significant, often overlooked, contributors to climate change, challenging the sustainability of our digital practices.
  • 🔒 The way technology is designed into society enables surveillance and control, as exemplified by Amazon's warehouse systems and the potential for social media to manipulate behavior.
  • 🛑 The talk concludes with a call to action to rethink technology not for solutions per se, but for maneuvers that can redirect its impact towards more ethical, understandable, and democratic uses.

Q & A

  • What is the title of the talk and what does it signify?

    -The title of the talk is 'New Dark Age,' which signifies the speaker's deep concern about the trends in technology and their impact on society, despite coming from a background that loves technology.

  • What is the significance of the term 'cloud' in the context of the talk?

    -In the talk, the term 'cloud' is used to describe the misconception of it being a magical, mysterious place where data is stored and processed. It is emphasized that the cloud is actually physical infrastructure owned by companies and has real-world impacts.

  • How is the history of weather prediction related to the development of computation?

    -The history of weather prediction is intrinsically linked to computation as it was one of the first applications that required massive data processing. Lewis Fry Richardson's work in the 1920s laid the foundation for using mathematics and data to predict the weather, which later required computational power that led to the development of early computers like the ENIAC.
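Richardson's scheme — divide the world into discrete boxes, take measurements, then compute the data forward to predict the future — can be illustrated with a toy sketch. This is not Richardson's actual atmospheric equations (his scheme solved the full primitive equations); it is a minimal one-dimensional upwind-advection example, with made-up temperature values, showing the same grid-box idea:

```python
def step(temps, wind, dt=1.0, dx=1.0):
    """Advance a 1-D ring of grid-box temperatures one time step.

    Toy upwind advection: each box is nudged toward its upwind
    neighbour's value, mimicking Richardson's idea of computing
    each box's future state from measurements in nearby boxes.
    """
    n = len(temps)
    c = wind * dt / dx  # Courant number; keep <= 1 for stability
    return [temps[i] - c * (temps[i] - temps[(i - 1) % n]) for i in range(n)]

# Divide a region into 8 boxes and "measure" an initial warm patch...
field = [10.0, 10.0, 10.0, 20.0, 20.0, 10.0, 10.0, 10.0]
# ...then compute the data forward in time to predict the future state.
for _ in range(3):
    field = step(field, wind=0.5)
```

Richardson did exactly this arithmetic by hand, box by box; the ENIAC later made it fast enough to outrun the weather itself.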

  • What is Moore's Law and how does it relate to the talk?

    -Moore's Law is the observation that processing power doubles approximately every two years. The talk discusses how this law has influenced our expectations of technological progress and the belief in the potential of computers to solve any problem, but also points out its limitations and the issues arising from an overreliance on computational power.
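The compounding the talk describes is easy to make concrete. A small sketch of the rule-of-thumb form used in the talk (doubling every two years; the period is a parameter, since statements of the law vary):

```python
def moores_law(years, doubling_period=2.0):
    """Growth factor in processing power after `years`,
    if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# A decade of doubling every two years is a 32x increase...
decade = moores_law(10)   # 2**5 = 32
# ...and fifty years compounds to over 33 million times.
fifty = moores_law(50)    # 2**25 = 33,554,432
```

That compounding is what produced the expectation the speaker critiques: that "fuel for total expansion" would always be available.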

  • What is 'Eroom's Law' and how does it contrast with Moore's Law?

    -Eroom's Law is a term coined to describe the opposite trend of Moore's Law, suggesting that the increase in computational power has not led to a proportional increase in drug discovery. It highlights the failure of sheer computational power to solve complex problems like drug discovery.

  • How does the speaker connect technology to climate change?

    -The speaker connects technology to climate change by pointing out that the electricity required for data processing and AI is a significant driver of climate change, and that technology, including the internet, contributes to environmental issues.

  • What is 'clear air turbulence' and how is it related to climate change?

    -Clear air turbulence is a type of atmospheric disturbance that is unpredictable and is becoming more prevalent due to climate change. It is mentioned in the talk to illustrate how our ability to predict weather patterns is being compromised by the changing climate.

  • What is 'automation bias' and how does it relate to technology?

    -Automation bias is a psychological phenomenon where people tend to overly trust automated systems, even when they provide incorrect information. The talk discusses how this bias can lead to dangerous outcomes, as people are becoming overly reliant on technology and losing their ability to think critically.

  • What is the 'Keeling curve' and what does it illustrate?

    -The Keeling curve is a graph that shows the exponential growth of CO2 in the atmosphere, measured at the Mauna Loa Observatory in Hawaii. It illustrates the increasing levels of CO2, surpassing 400 parts per million, which is a critical marker for climate change discussions.

  • How does the speaker discuss the impact of technology on labor and society?

    -The speaker discusses the impact of technology on labor and society by giving examples such as Amazon warehouses, where workers are monitored and directed by algorithms, leading to de-skilling and a lack of autonomy. This, along with other examples like dark kitchens and Pokemon Go, shows how technology is used to hide labor and manipulate consumers.

  • What are the 'three small maneuvers' the speaker suggests as a way to counter the negative impacts of technology?

    -The 'three small maneuvers' suggested by the speaker include using technology to produce different outcomes, repurposing technology for beneficial purposes, and insisting on transparency and understanding in technology design to ensure it serves democratic and ethical values.

Outlines

00:00

🌧️ The Concept of a 'New Dark Age' in Technology

The speaker, a writer, visual artist, and computer scientist, introduces the concept of a 'new Dark Age' in technology, expressing concern about current technological trends despite a background in computer science. The term 'cloud' is dissected, revealing its physical nature as large buildings filled with computers owned by companies with real-world impacts. The speaker emphasizes the importance of understanding technology as an extension of ourselves and society, drawing parallels between technology and natural systems, and highlighting the historical connection between weather prediction and computation. The talk aims to explore the origins of the 'new Dark Age' phrase and its implications on our perception and interaction with technology.

05:00

🌐 The Illusion and Reality of the Cloud and Moore's Law

The speaker discusses the cloud as a metaphor for technology's intangible nature, contrasting its ethereal public perception with its physical reality as massive data centers. The historical development of weather prediction and its reliance on computational power is explored, from Lewis Fry Richardson's manual calculations during World War I to the first computerized weather forecasts using the ENIAC. The speaker then transitions to Moore's Law, which predicts the doubling of processing power every two years, and its influence on our expectations of technological progress. However, the speaker introduces 'Eroom's Law' as a counterpoint, illustrating how increased computational power has not led to proportional gains in drug discovery, suggesting limitations in our reliance on technology for problem-solving.

10:01

🌡️ Climate Change and the Unpredictability of Modern Technology

The speaker addresses the impact of climate change on the unpredictability of weather, particularly clear air turbulence, and how it challenges our reliance on historical data for future predictions. The speaker also discusses the role of technology, including the internet and big data processing, as significant contributors to climate change. The 'Keeling Curve' is mentioned to illustrate the exponential growth of CO2 in the atmosphere, and the speaker connects high indoor CO2 levels to decreased cognitive function, suggesting that our technological advancements may be inadvertently hindering our ability to address these challenges.

15:04

🤖 Automation Bias and the Hidden Consequences of Technology

The speaker delves into the psychological phenomenon of automation bias, where people tend to trust technology even when it leads to incorrect decisions. Examples such as 'death by GPS' and the over-reliance on automated systems in various fields are given to illustrate the dangers of this bias. The speaker also discusses the societal impacts of technology, such as the dehumanizing effects of Amazon's warehouse system and the exploitation of workers in 'dark kitchens' for food delivery apps. The paragraph highlights how technology is designed to hide the labor and human costs behind the convenience it provides.

20:05

🕹️ The Manipulation of Reality and Public Opinion Through Technology

The speaker explores how technology is used to manipulate public opinion and reality, citing examples like Pokémon Go, which directs users to advertiser locations, and YouTube's autoplay and recommendation algorithms, which can radicalize viewers by promoting sensational or false content. The speaker also discusses the role of technology in politics, such as the Internet Research Agency's disinformation campaigns aimed at sowing confusion and mistrust. The paragraph emphasizes the ease with which people can be deceived by technology and the broader societal effects of these manipulations.

25:08

🛰️ Redirecting Technology for Positive Change

The speaker concludes with a discussion on how technology can be repurposed for positive outcomes. Examples include London delivery drivers using an app to organize a strike and the repurposing of US spy satellites into scientific instruments for space exploration. The speaker argues that technology itself is not the problem but rather how it is used and designed. The speaker advocates for a return to transparent, educative, and ethical technology that empowers rather than confuses or controls people, emphasizing the importance of understanding and participating in the technologies we use.

Keywords

💡New Dark Age

The term 'New Dark Age' refers to a potential future where our reliance on technology leads to a lack of understanding and control over our own lives. In the video, the speaker expresses concern about the direction technology is taking us, suggesting it could lead to a period of ignorance and helplessness, similar to the historical Dark Ages. The phrase is used to provoke thought about the implications of technology becoming too complex or opaque for the average person to understand, potentially leading to a loss of agency and wisdom.

💡Cloud

In the context of the video, 'cloud' refers to cloud computing, which is often perceived as a mysterious and intangible service where data and processes are managed remotely. The speaker points out the irony in calling it 'cloudy,' as it is actually composed of large, physical data centers. The cloud's invisibility can lull us into a false sense of security or understanding, detaching us from the tangible realities of data storage and processing.

💡Technology

Technology, as discussed in the video, encompasses the tools, systems, and machines that we create and use to solve problems or perform tasks. The speaker argues that when we talk about technology, we are often actually discussing human society and our interactions with each other. Technology is not just about the devices themselves but also the societal impacts they have, which can be both positive and negative.

💡Moore's Law

Moore's Law is the observation that the number of transistors on a microchip doubles approximately every two years, which has historically meant that computing power also doubles in that time frame. In the video, the speaker uses Moore's Law to illustrate an expectation of continuous technological advancement. However, the speaker also points out that this law may not hold true indefinitely, and our reliance on it could lead to unforeseen consequences.

💡Eroom's Law

Eroom's Law is a humorous reversal of Moore's Law, coined to reflect the observation that the rate of improvement in fields like drug discovery has slowed down despite increasing computational power. The speaker uses this term to highlight the limitations of technology and computation in certain areas, suggesting that not all problems can be solved simply by adding more computing power.

💡Climate Change

Climate change is a central theme in the video, where the speaker discusses how technology contributes to environmental issues, such as the energy consumption of data centers and the internet. The speaker also connects climate change to the unpredictability of weather patterns, which challenges our traditional methods of prediction and control, emphasizing the complex relationship between technology and the natural world.

💡Automation Bias

Automation bias refers to the human tendency to over-rely on automated systems, even when they provide incorrect or misleading information. The speaker cites studies where pilots and other trained professionals follow automated systems' bad advice, leading to errors. This bias is a warning about our increasing dependence on technology and the potential dangers of ceding too much control to machines.

💡Dark Kitchens

Dark kitchens are commercial kitchens that operate solely for the purpose of preparing food for delivery orders, often hidden from public view. The speaker uses dark kitchens as an example of how technology-driven convenience can obscure the labor conditions and human costs behind the scenes, highlighting the ethical implications of technological convenience.

💡Algorithm

An algorithm, in the context of the video, refers to a set of rules or processes used by computers to perform tasks or make decisions. The speaker discusses how algorithms can be designed to optimize for certain outcomes, such as user engagement, which may not align with ethical or societal values. This leads to concerns about the power and influence of algorithms in shaping our experiences and beliefs.

💡Disinformation

Disinformation is the deliberate spread of false or misleading information, often with the intent to influence public opinion or obscure the truth. The speaker mentions the Internet Research Agency as an example of state-sponsored disinformation campaigns, which use technology to sow confusion and mistrust. This keyword is crucial for understanding the video's message about the manipulation and misuse of technology in society.

💡Opacity

Opacity in the video refers to the lack of transparency or clarity in how technology works, particularly in systems that are complex or intentionally obfuscated. The speaker argues that the opacity of technology can lead to a lack of understanding and control, which in turn can result in confusion, fear, and anger. The concept is central to the video's argument that technology should be more accessible and understandable to the public.

Highlights

The speaker identifies as a technologist with a background in computer science, expressing concern about current technology trends.

The term 'new Dark Age' is introduced to describe the potential negative trajectory of technology's impact on society.

The 'cloud' is discussed as a metaphor for the intangibility and mystery surrounding modern technology.

The historical connection between weather prediction and the development of early computers is highlighted.

Lewis Fry Richardson's work in numerical weather prediction is cited as a foundational moment in computational science.

The ENIAC computer's role in both weather prediction and atomic bomb development is noted.

Moore's Law and its influence on expectations of computational power and societal progress are critiqued.

Eroom's Law is introduced as a counterpoint to Moore's Law, illustrating the diminishing returns in drug discovery.

The unpredictability of weather due to climate change challenges the traditional methods of data-based forecasting.

The Internet's significant contribution to climate change is discussed, contrasting its role as a tool for potential solutions.

The 'death by GPS' phenomenon illustrates the dangers of over-reliance on technology.

Automation bias is described, explaining how people tend to trust technology even when it leads to incorrect decisions.

The Amazon warehouse example is used to illustrate the dehumanizing effects of technology on labor.

Dark kitchens are presented as an example of how technology hides the labor behind convenience.

Pokemon GO is criticized for manipulating users' movements for advertising purposes without their knowledge.

YouTube's autoplay and recommendation algorithms are criticized for promoting misinformation and radical views.

The Ashley Madison data leak is used as an example of how easily people can be deceived by technology.

The Internet Research Agency's tactics to spread disinformation and confusion online are discussed.

The potential for technology to generate fake realities, such as deepfakes, is highlighted as a threat to trust.

The speaker argues against 'solutionism', suggesting that technology should be used to produce different outcomes.

The repurposing of a US spy satellite for scientific research is given as an example of positive technological redirection.

The speaker concludes by emphasizing the importance of understanding and ethical use of technology, rather than fearing it.

Transcripts

play00:00

I am unlike I think most of the people

play00:16

who've appeared on the stage so far

play00:18

today as someone who's a works for a

play00:22

technology company or has something to

play00:26

sell in fact I'm a I'm a writer I'm a

play00:28

visual artist I'm a computer scientist

play00:32

by training I consider myself to be a

play00:34

technologist a kind of vague word that

play00:36

means I'm just really really interested

play00:38

in this stuff the title of this talk is

play00:41

new Dark Age which sounds grim because

play00:44

it is because I find myself despite

play00:49

coming from a as I say a background in

play00:51

computer science a background as an

play00:53

internet hippie a kind of joyous nerd

play00:56

someone who loves technology to be

play00:58

deeply concerned about many of the

play01:00

trends we see in technology and thus in

play01:04

the world today and I'm going to talk a

play01:06

little bit about where that phrase comes

play01:08

from and what I mean by it I'm

play01:11

interested in the way that we talk about

play01:14

technology primarily because I think

play01:16

when we talk about technology we're

play01:17

usually really just talking about us and

play01:19

it's in our society and the way that we

play01:21

that we interact with one another and so

play01:24

an interesting place to start with this

play01:25

idea of a new Dark Age is with the cloud

play01:29

which is such a fascinating term to me

play01:32

we hear this talk of the cloud as though

play01:35

it's some sort of magical mysterious

play01:37

faraway place where stuff just happens

play01:39

you know magically and beautifully we

play01:42

upload our photos we tell it our secrets

play01:44

we talked to our friends through it we

play01:45

give it a lot money but it's sort of

play01:48

nebulous and invisible and yet it's

play01:50

always really important to remember that

play01:52

the cloud is actually very solid it's

play01:54

huge buildings filled with computers

play01:57

that are owned by companies that exist

play01:59

within legal jurisdictions within

play02:02

particular geographies that have an

play02:04

impact on the world in in many ways that

play02:07

we'll talk about but then also I always

play02:10

want to insist that

play02:11

that name still tells us something

play02:14

important about how we interact with it

play02:16

that it remains sort of cloudy and it's

play02:18

cloudiness the very uncertainty that it

play02:21

brings is is super important and I like

play02:24

this image of Technology and the weather

play02:27

technology relating it to natural

play02:29

systems I found looking at that and

play02:31

particularly in its history to be super

play02:33

productive in thinking about these

play02:35

things because there's always been you

play02:37

know at least for the last century a

play02:38

very interesting and tight relationship

play02:41

between our ideas about the weather and

play02:43

and computation itself this is a few

play02:47

pages from one of my favorite books this

play02:49

is Lewis fry Richardson 1922 text

play02:51

weather prediction by numerical methods

play02:54

Richardson was one of the first

play02:56

scientists mathematicians to argue that

play02:59

would be possible to predict the weather

play03:00

through data this at the time was a

play03:03

super radical idea and no one believed

play03:06

that the the the natural environment was

play03:08

kind of susceptible to mathematics in

play03:10

this way but which since proposed a

play03:13

method which really came to define all

play03:15

of computation which is that if you

play03:17

divide the world up into discrete boxes

play03:19

take certain measurements produced data

play03:21

you can then compute that data and

play03:23

predict the future Richardson did this

play03:26

he wrote a book on it before he wrote

play03:27

the book he actually did a full weather

play03:29

calculation he took all of this data for

play03:32

Western Europe and he worked out point

play03:34

by point what the weather would be like

play03:36

but this was before computers he did

play03:38

this with pen and paper and it took him

play03:41

about three months to do a single daily

play03:43

forecast he also did it under shellfire

play03:44

because it was an ambulance driver in

play03:46

the first world war at the time it was

play03:48

an astonishing achievement but he didn't

play03:50

really imagine that this would actually

play03:52

be effective because he didn't foresee

play03:53

computers in fact it took another 30 odd

play03:56

years before this was transformed into a

play03:58

computer program this is the same or

play04:01

very similar weather forecast performed

play04:03

on a computer and it's in fact the first

play04:05

24-hour forecast that ran quicker than

play04:07

24 hours because we finally had

play04:09

computational power that would keep up

play04:11

with the actual weather and this is the

play04:14

computer that was performed on this is

play04:16

the ENIAC one of the very earliest

play04:18

computers that was built in the US

play04:19

during the Second World War

play04:21

ENIAC one of my absolute favorite

play04:23

computers can nerd out of this

play04:25

you want and it's a beautiful thing it

play04:27

took up two whole rooms here at the

play04:29

Aberdeen Proving Grounds in Maryland and

play04:32

it was invented basically to do two

play04:33

things

play04:34

to predict the weather and to build

play04:36

atomic bombs that's basically where

play04:38

computers come from from weather

play04:40

prediction and at building atomic

play04:42

weapons and they contain that history

play04:44

within them to this day there's a

play04:46

beautiful little story by one of the

play04:49

engineers who first worked on the on the

play04:52

ENIAC a guy called Harry Reid in his

play04:54

kind of a farewell address he says this

play04:57

beautiful thing where he said working on

play05:00

the ENIAC was kind of like living inside

play05:03

the computer because it completely

play05:07

contained you it was a very personal

play05:09

relationship and now we think of a

play05:11

personal computer is something very

play05:12

small that we carry around with us at

play05:14

all times but actually that's not really

play05:17

true this room sized computer didn't

play05:20

shrink down at all rather it expanded

play05:23

and now it includes the whole of the

play05:25

planet including even goes up into house

play05:28

space in the form of satellites we all

play05:30

live inside that computer that Harry

play05:33

Reid and others envision in the 1950s

play05:35

and it effects every aspect of our lives

play05:38

it also really affects the way we kind

play05:41

of view the world and have expectations

play05:43

of it in the future this a graph that

play05:46

many of you I'm sure familiar with is

play05:47

Moore's law the the rule of thumb

play05:50

developed in the 1950s that would say

play05:51

that said the processing power would

play05:53

double every two years and amazingly it

play05:56

has held true ever since Gordon Moore

play05:58

the guy who who came up with this and

play06:00

from Intel it was just a rule of thumb

play06:03

it was just something he observed he

play06:04

didn't really expect it to last and yet

play06:07

it has and it's kind of got inside our

play06:09

heads it's produced a kind of idea of

play06:12

the world that if we only have more

play06:13

computers and more processing power will

play06:16

basically be able to achieve anything

play06:17

we'll always have this fuel for total

play06:20

expansion and that's meant to be a

play06:21

slightly worrying phrase for the fuel

play06:24

for continuous and ever-growing

play06:26

expansion is something that's actually

play06:28

causing quite severe problems for us in

play06:30

the present here's a graph that goes the

play06:33

other way so not Moore's law going

play06:35

always up and to the right

play06:36

but this is a graph of something that

play06:38

people in the pharmacological sciences

play06:40

people are working on development of new

play06:42

drugs coined eroom's law that's Moore's

play06:45

law backwards because this is a

play06:47

discovery they've made that in fact over

play06:49

the last 20-30 years as more and more

play06:52

computers have been thrown at the

play06:53

problem of drug discovery the results

play06:55

have actually got less we're discovering

play06:58

that the larger and larger datasets and

play07:00

more and more powerful computation are

play07:02

not actually helping us with forms of

play07:04

discovery that we need it's actually

play07:06

getting harder and harder to sort

play07:08

through this data even with the newest

play07:11

tools purely using computational methods

play07:13

what's fascinating about this is that

play07:16

many pharmacological companies are now

play07:17

actually changing their practices so

play07:19

they don't just rely on kind of massive

play07:21

data sets and powerful computational own

play07:23

but actually start to return to the idea

play07:25

of having small teams of scientists

play07:27

working on hunches essentially working

play07:29

on field using their own human

play07:31

experience in opposition to the purely

play07:33

computational thinking and this failure

play07:37

of computation alone to accurately

play07:40

predict and assist us in the future is

play07:42

being replicated everywhere in fact in

play07:46

the very first thing that we set out to

play07:47

measure and predict in the first place

play07:49

which is the weather these images of

play07:52

turbulence in the North Atlantic from

play07:54

the latest research papers the the

play07:58

atmosphere as I'm sure you're aware due

play08:00

to climate change is warming not

play08:03

predicted not you know predictably or

play08:05

unpredictably in the future but right

play08:07

now these are graphs of the current

play08:09

situation as the atmosphere warms air

play08:12

masses in their behavior become less

play08:15

predictable as a result huge areas of

play08:18

atmosphere sheer against each other

play08:20

producing what's called clear air

play08:22

turbulence clear air turbulence is

play08:24

specifically the turbulence that comes

play08:26

out of clear air it's totally

play08:28

unpredictable and it's getting worse as

play08:30

a result of climate change and we can't

play08:32

predict it or the other whether that's

play08:34

occurring because the only thing we have

play08:36

to go on is past data which because of

play08:39

climate change is no longer the case our

play08:42

entire practice of using past data to

play08:45

predict the future is starting to fail

play08:47

because of climate change

play08:49

also as a result of these technological

play08:51

ideas we've inherited from the history

play08:53

of Technology itself and in fact of

play08:55

course technology is one of the main

play08:57

drivers of climate change not just

play08:59

generally in terms of the legacy

play09:02

technologies of fossil fuels that we use

play09:04

all the time but also in contemporary

play09:06

technologies the Internet is a vast

play09:08

driver climate change in itself the

play09:10

electricity required to do all the kinds

play09:13

of big data or AI processing you might

play09:15

be here today is a massive driver of

play09:17

climate change equivalent at least to

play09:19

the whole of the airline industry so

play09:21

we're already contributing to this

play09:23

through the networks that we're hoping

play09:25

will kind of get us out of it

play09:27

and we're not going to be able to think

play09:29

about this at all much more clearly for

play09:31

very long this is another graph going up

play09:34

into the right

play09:35

This is CO2 as measured at the Mauna Loa Observatory in Hawaii; it's called the Keeling curve. It shows the growth of CO2 in the atmosphere, which has been going on for, well, forever, essentially, but increasing exponentially in the last 50 years. What this graph shows is that we surpassed 400 parts per million in the atmosphere a couple of years ago. What it doesn't show is that indoor CO2 regularly passes a thousand parts per million; it's probably somewhere close to that in this room right now, since there's not a lot of ventilation and we're all breathing in here. At over a thousand parts per million, human cognitive ability drops by 20%. You're made dumber by breathing in CO2, and we're actively increasing the CO2 in the atmosphere. It's getting harder and harder to think, and our technology is effectively contributing to this at present.

One of my favorite examples of

this is a phenomenon that park rangers in the US have named "death by GPS". This is when people have become so accustomed to trusting the technological systems they've been given that they follow them wherever they go. People have died in the middle of Death Valley because they drove their cars down dirt tracks, following that little bright line; there are examples of people driving into rivers, or into the sea, because the map shows that that's where the road goes. We've given so much over to these computational systems that we're losing our ability to think for ourselves. And in case you think that's just something that basically stupid people do, it's actually an example of something that's quite well known in psychological and neurological studies.

It's called automation bias, and it means, basically, that we trust technology: even the deep structures in our brains do this. Researchers have put pilots inside highly sophisticated simulators, very well-trained pilots, people with thousands of hours of flying experience, and set up simulated emergencies in which the pilots know exactly what they need to do; and then they added some kind of automated system that, at a critical moment, suggests they do the wrong thing. And 90% of the time, even highly trained people follow the bad advice, with hideous consequences, when that advice comes from an automated system that they trust. Our brains basically like the easy way out; they're very easy to short-circuit, particularly with technology and things in which we trust.

Of course, not all of us even have the choice about following these automated systems, because of the ways we're designing them into society. This is a picture of an Amazon warehouse, which is an absolutely fascinating piece of infrastructure that displays some of the qualities of the kinds of spaces I'm talking about. This is one of the places where you have an ENIAC-like room of computation: you wouldn't really know by looking at it that you're looking at a computer, but you kind of are. Amazon uses this really extraordinary thing called chaotic storage. Basically, if you're a large e-commerce company and you have millions if not billions of things for sale, people don't order them alphabetically, right? They don't order them nicely; essentially, they all do a bunch of random stuff. And if you have a vast warehouse, you don't want the people picking those things to have to walk two kilometers this way to get one item and then two kilometers back that way, so you need to group items according to how people actually order them. And they use an algorithm to do this, the chaotic storage algorithm, which means that on these shelves you might find a book next to a DVD, next to some cleaning products, next to bath products: whatever the algorithm has decided is likely to be bought together. The result is that this space is completely unnavigable to humans. It looks like chaos; you have no chance of finding anything in it, which is why the employees who work in these warehouses wear wrist-mounted devices that guide them around the space, like GPS units. They're completely automated by a machine.

The side effects of this, of course, are that it's also possible to monitor the employees totally: to know how long their lunch breaks are, when they take toilet breaks. It's possible to de-skill the workforce; there's no longer any incentive to educate your workforce when all they have to do is follow a guide like this. And it's also very possible to monitor them and keep an eye on who's talking to whom, who's planning to unionize: to have full and utter control over your employees. And everything beyond that, as I say, including the lack of any necessity to educate your employees, is damaging to society more broadly as well.
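The grouping idea behind chaotic storage can be sketched in a few lines. This is a deliberately minimal toy, not Amazon's actual (proprietary) algorithm: count how often pairs of items appear in the same order, then greedily slot the strongest co-purchases into the same bin, regardless of category.

```python
from collections import Counter
from itertools import combinations

def chaotic_slotting(orders, bin_size):
    """Greedily place items that are often bought together into the
    same storage bin, ignoring any human-readable categorization."""
    # Count how often each pair of items appears in the same order.
    pair_counts = Counter()
    for order in orders:
        for a, b in combinations(sorted(set(order)), 2):
            pair_counts[(a, b)] += 1

    bins, assigned = [], set()
    # Process the strongest co-purchase pairs first.
    for (a, b), _ in pair_counts.most_common():
        for item in (a, b):
            if item in assigned:
                continue
            # Prefer a bin that already holds the item's partner.
            target = next((bn for bn in bins
                           if (a in bn or b in bn) and len(bn) < bin_size),
                          None)
            if target is None:
                target = []
                bins.append(target)
            target.append(item)
            assigned.add(item)
    return bins

orders = [
    ["book", "dvd"], ["book", "dvd", "soap"],
    ["soap", "shampoo"], ["book", "dvd"],
]
print(chaotic_slotting(orders, bin_size=3))
```

Book, DVD, and soap end up shelved together because shoppers buy them together; to a human walking the aisles, the layout looks like chaos.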

There's something very weird, I think, in that we're designing so many of the technological tools we use every day to effectively hide things from us. These are what are known as "dark kitchens". If you use a delivery app in a big city in the UK, things like Uber Eats and Deliveroo, demand has so far outstripped the supply possible from the restaurants that the popular restaurants basically put up these containers in car parks, where chefs work 12-hour shifts to provide meals for the orders. The distance between this and the image of technological convenience that we sell is extraordinary. I find it amazing that we put so much effort into hiding the labor and the work that actually go into providing us the lives we want. And of course this isn't just at the level of delivery apps; it's happening across almost everything we do. For everything that's made technologically convenient, something is hidden, and it's usually people, who are worse off and getting worse off because of these tools. The examples of this are just extraordinary.

I am fascinated by the phenomenon of Pokémon Go. I'm sure there are some people here who play it, but I'm also highly aware that most people who play it are not aware that many of the locations they are taken to by the app have been sold to advertisers, so that you literally follow the map to find your Pokémon gym, or your high-value Pokémon, and suddenly find yourself deposited at the front door of a fast-food restaurant, or a particular shop, that's been personally identified for you by the various analytics the company holds on you and aggregates from everywhere else. So it's not just the Amazon workers who are being directed around, step by step, for your convenience; it's also millions and millions of people playing alternate-reality games who have literally no idea that they're being walked through the world, and guided through it, by forces that they have literally no idea about. And this

has, in other places, absolutely devastating societal effects. One of the places where this plays out incredibly clearly is YouTube, which is frankly a cesspit of awful things, but one particularly unpleasant tactic is autoplay and its suggestion algorithm. Autoplay and the YouTube suggestion algorithm are completely uncoupled from any kind of idea of societal value or ethics, and this is exhibit A. This is Walter Cronkite talking very sensibly, very straightforwardly, about climate change back in 1980, and these are the suggestions that YouTube thinks you should watch next: a succession of videos debunking, or claiming to debunk, climate change; saying that this is not real, that this is not something you should listen to. This is the suggestion from a vast corporation that millions and millions of people follow, and it shapes their opinions and their thoughts in very real ways. And there's

something happening here that's actually quite well documented; there are a lot of papers on this. Basically, what you have is an algorithm that's been optimized for people's attention, and that's it: all it wants to do is keep you watching for longer. And it has discovered, by accident but very reliably, that what people want is contrary opinions; what they want is sensation; what they want is the discovery that they know something other people don't. It's a very basic human desire, and so YouTube essentially radicalizes people. It takes you on a journey that may start in a very innocuous place, or even a sensible, scientific place, and deliberately moves you into a place of increased political paranoia, and uncertainty, and falsity. This is not what the algorithm was designed to do, but it's what happens when technology is decoupled from any wider context or social ethics. And the problem is that we're so easy to do this to. My favorite

example of this: men, let's say, are very easy to do this to; I'll narrow it down to that. Some of you may have heard of Ashley Madison, which was a dating website for people who wanted to have affairs. A few years ago they had a huge data leak: hackers got in, took everything out, and put it all on the Internet, which was very, very embarrassing for a lot of people. But researchers went through the data, and they discovered that though this was a website for men and women, it probably won't surprise you too much to hear that 90% of its users were men. In fact, when they looked at the accounts registered as female, they discovered that only about a thousand of those were active; the rest belonged to people who had logged on once and walked quietly away, as they should have done. But those thousand accounts each sent tens of thousands of messages a day: they were completely automated, and this site was making millions and millions of dollars. It convinced millions and millions of men to have sexy conversations with bots, and they were paying for it. People are incredibly easy to trick in this way; certain things work better than others, but this extends, as I say, across pretty much all of our social networks, and really, therefore, into politics and society broadly.

It's not just about trickery

either; it's not just about confusing people, or getting them to do something, or changing their minds. A lot of the arguments around the role of technology that play out in the political sphere revolve around this idea that we're going to change people's minds and make them do something they wouldn't otherwise do. That's not always the intention, and this is really key: the intention is often just confusion itself. This is the building in Saint Petersburg that houses something called the Internet Research Agency, which is, essentially, a Russian government-funded disinformation machine. Hundreds of people work there as, essentially, professional trolls: leaving comments on websites, sharing disinformation online, and so on and so forth. There's a really interesting interview with someone who worked at the Internet Research Agency, who describes their strategy, which actually describes the Russian government's strategy in general, and probably that of a number of other states. They said: we realized a long time ago that it actually wasn't possible to change people's minds. What we want to do is muddy the waters; we want to make the Internet so horrible that nobody sensible will want to have anything to do with it; we're just going to poison the discourse. And that's very easy to do, because it's so easy to manipulate, and be manipulated, through these systems; and because they're, for most people, so hard to understand, and so poorly designed, and so much is made invisible by them, it's very hard for us to make informed decisions, or to think clearly about what's actually happening on these systems. And we're doing this deliberately; we're continuing to do it all the time. I find it extraordinary that so much contemporary attention is paid to systems which are increasingly intended to confuse us further.
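The attention-optimization dynamic described above is easy to reproduce in a deliberately crude toy model. Everything here is invented for illustration (the catalog, the "sensationalism level", and the watch-time function are assumptions, nothing like YouTube's actual system): an optimizer that maximizes only predicted watch time, in a world where slightly-more-sensational content holds attention slightly longer, walks a viewer step by step toward the extreme.

```python
def pick_next(user_level, catalog):
    """Greedy recommender: choose whatever maximizes predicted watch
    time, with no notion of accuracy, value, or ethics."""
    return max(catalog, key=lambda v: predicted_watch_time(user_level, v))

def predicted_watch_time(user_level, video):
    # Toy assumption: people watch longest the video one notch more
    # sensational than where they already are (score peaks at +1).
    return -abs(video["level"] - user_level - 1)

# Levels 0..10: 0 = sober news broadcast, 10 = full conspiracy.
catalog = [{"title": f"video-{lvl}", "level": lvl} for lvl in range(11)]

viewer = 0  # starts on a sensible, Cronkite-style video
history = []
for _ in range(8):
    nxt = pick_next(viewer, catalog)
    history.append(nxt["title"])
    viewer = nxt["level"]  # each watch nudges the viewer along

print(history)  # the viewer is walked, one notch at a time, toward the extreme
```

No step of the loop "wants" radicalization; the drift falls out of optimizing a single engagement metric.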

These are outputs from a neural network. This is from a presentation I saw a few weeks ago that showed how brilliantly a bunch of researchers could generate entirely new faces, fake people. None of these are photographs of actual people; they are all computer-generated images, which can then be masked onto film, or onto television, or into the news, to produce entirely fake realities. And this is what's coming down the line, if it's not, as I very much suspect, pretty much in operation already: the point at which trust breaks down across pretty much all frontiers. And for me this is really

critical to our understanding of the world today. When people don't understand how the things they use actually work, when they know that information is being deliberately withheld from them, that's deeply distressing. It undermines our sense of self and our sense of agency, and the result of that is confusion, it's fear, and it's anger, which are, as I'm sure you'll agree, the dominant emotions of social and political life across most of our societies today. I think there's a concrete and causal relationship between the opacity of the technologies we use every day, the way those things are constructed, and the lack of general understanding and the lack of ethics that exist within them. We live in confused and fearful times, and it's producing the politics we have, and we're failing in our duty to assist with the education and the understanding that would change that. So: I've talked about this

darkness quite a lot, and I don't have a lot of time left, and I don't want to go out on a complete bummer, because I do that all the time and it does my head in as much as I'm sure it does yours. So I've been trying to think about how to end this talk without it being completely awful before you have lunch. I'm really opposed to the idea of solutions; I'm opposed to solutionism in general, because for me it's part of the problem. The idea that, here's a problem, we'll build an app for that, is exactly the problem we're in. So I don't talk about solutions; I don't talk about answers. I talk about maneuvers, and here are three small maneuvers, three small stories, that might undo some of that other stuff.

The first is about how we can use the technologies we have to produce radically different outcomes, and my favorite example of this is a strike by delivery drivers in London. As I said earlier, when you're following that little device around, when you have no control over your direction and your contact, it can be very hard to create a union, to argue for better working conditions, to confront your boss. So what delivery drivers in London did was this: a few of them managed to get together on an online forum, and they went to the company's offices and started using the app to order pizza to themselves there. They got more and more drivers to come in, and they built a protest through the app, by using it to actually introduce them to other workers rather than alienating them from one another. Another

example is about repurposing, or rethinking, what it is we want these technologies to be doing: literally turning them around. This is a US spy satellite. These are vast, incredible technologies of extraordinary power that, like the original ENIAC computer, were mostly designed to be pointed at us, to be used as weapons. Something really weird happened a few years ago, something I really loved. I imagine it happening a bit like this: maybe at a conference or something, someone from the National Geospatial-Intelligence Agency, which is like the even-more-secret-than-the-NSA US spy satellite agency, sidled up to someone from NASA and asked: want a couple of satellites? It turned out the National Geospatial-Intelligence Agency had two Hubble-quality space telescopes sitting on the shelf that they had never used and that were clearly obsolete; they'd clearly got something way better, and scarier, by now. So they donated these two satellites to NASA, and NASA basically said, yes, we'll take them. They're currently repurposing one of them into something called the Wide Field Infrared Survey Telescope: basically a new, incredibly powerful space telescope that will be used to search for new galaxies, for new habitable worlds, for incredible scientific achievements. It's this beautiful image of taking a technology that was designed to be aimed down at us, literally flipping it around, looking out, and seeing what we could discover instead, for all of our benefit. And finally, I want

to insist that none of this is about the technology itself. The technology itself is not inherently opaque; it's not inherently complex and impossible to understand; it's not inherently dangerous. And this has always been true. This is what I think of as one of the first technologies of democracy: a thing called the kleroterion, a large stone which used to stand in the ancient Agora of Athens, in 300 BC, when they invented democracy, and it was a machine for administering democracy, in a beautiful way. The entire suffrage, which to be clear was only free, adult, property-owning men (we can do a lot better), but the entire suffrage would come down, and those who were to be chosen by lot would insert little ID tags into the front of this stone, and someone would pour a set of balls down a tube, and according to the colors of those balls, black and white, the people whose ID tags corresponded would be put in charge. They actually had a system based on what's called sortition, rather than elections: the people in charge were chosen by lot, which is also a thing I totally think should come back. But my point

is this: the technology that ran that democracy was something that stood in the middle of the market square, that was visible to everyone. Anyone could come down and see it at work, and they could understand, and fully participate in, the system they were engaged in. It's a simple technology; technology has become more complicated since, but the values it holds don't change. We can think about the context of the things we build, and we can work to make them educative, or more just, or increasing of equality, rather than intending to confuse, intending to overcome, to predict, to take the place of people, and essentially to remove their agency. So please, everyone: think about working on things like space telescopes, and kleroteria, and projects like this, that will actually assist us in getting out of the morass that we're presently in. Thank you very much.
